Automatic Differentiation for Riemannian Optimization on Low-Rank Matrix and Tensor-Train Manifolds

Authors

Abstract

In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions. Since matrices and tensors of fixed rank form smooth Riemannian manifolds, one of the popular tools for finding low-rank approximations is to use Riemannian optimization. Nevertheless, the efficient implementation of Riemannian gradients and Hessians, required in Riemannian optimization algorithms, can be a nontrivial task in practice. Moreover, in some cases, analytic formulas are not even available. In this paper, we build upon automatic differentiation and propose a method that, given an implementation of the function to be minimized, efficiently computes Riemannian gradients and matrix-by-vector products between an approximate Riemannian Hessian and a given vector.
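To make the idea concrete, below is a minimal sketch in JAX for the fixed-rank matrix manifold; all names are illustrative, and this is not the paper's implementation, which works directly with the factorized representation rather than materializing dense matrices. The Euclidean gradient of a user-supplied cost f is obtained by automatic differentiation and projected onto the tangent space at X = U diag(S) V^T; the approximate Riemannian Hessian-vector product projects the Euclidean Hessian applied to a tangent vector, dropping curvature terms.

    import jax
    import jax.numpy as jnp

    def tangent_project(U, V, Z):
        # Orthogonal projection onto the tangent space of the rank-r manifold
        # at X = U diag(S) V^T:  P_X(Z) = U U^T Z + Z V V^T - U U^T Z V V^T.
        UtZ = U.T @ Z
        return U @ UtZ + (Z - U @ UtZ) @ V @ V.T

    def riemannian_grad(f, U, S, V):
        # Euclidean gradient via autodiff, then projection; the dense X is
        # formed here only for clarity of exposition.
        X = (U * S) @ V.T
        return tangent_project(U, V, jax.grad(f)(X))

    def approx_riemannian_hvp(f, U, S, V, xi):
        # Approximate Riemannian Hessian-vector product for a tangent vector xi:
        # project the Euclidean Hessian-vector product (curvature terms dropped).
        X = (U * S) @ V.T
        return tangent_project(U, V, jax.jvp(jax.grad(f), (X,), (xi,))[1])

    # Hypothetical usage: a quadratic cost toward a random target A.
    key = jax.random.PRNGKey(0)
    A = jax.random.normal(key, (6, 5))
    U, S, Vt = jnp.linalg.svd(jax.random.normal(key, (6, 5)), full_matrices=False)
    U, S, V = U[:, :2], S[:2], Vt[:2].T
    f = lambda X: 0.5 * jnp.sum((X - A) ** 2)
    g = riemannian_grad(f, U, S, V)
    h = approx_riemannian_hvp(f, U, S, V, g)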


Similar articles

Low-Rank Tensor Completion by Riemannian Optimization

In tensor completion, the goal is to fill in missing entries of a partially known tensor under a low-rank constraint. We propose a new algorithm that applies Riemannian optimization techniques on the manifold of tensors of fixed multilinear rank; in symbols, the problem is stated below. More specifically, a variant of the nonlinear conjugate gradient method is developed. Paying particular attention to the efficient implementation, ou...
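In assumed notation (not taken from the truncated abstract): with \Omega the set of observed indices and P_\Omega the operator that zeroes out unobserved entries, the problem reads

    \min_{\mathcal{X} \in \mathcal{M}_{\mathbf{r}}} \; \tfrac{1}{2} \| P_\Omega(\mathcal{X}) - P_\Omega(\mathcal{A}) \|_F^2,
    \qquad
    \mathcal{M}_{\mathbf{r}} = \{ \mathcal{X} : \operatorname{rank}(X_{(k)}) = r_k, \ k = 1, \dots, d \},

where X_{(k)} denotes the mode-k matricization of \mathcal{X}, so the multilinear rank is the tuple \mathbf{r} = (r_1, \dots, r_d).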


Low-Rank Matrix Completion by Riemannian Optimization

The matrix completion problem consists of finding or approximating a low-rank matrix based on a few samples of this matrix. We propose a novel algorithm for matrix completion that minimizes the least-squares distance on the sampling set over the Riemannian manifold of fixed-rank matrices. The algorithm is an adaptation of classical nonlinear conjugate gradients, developed within the framework o...
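The corresponding matrix problem, again in assumed notation, is

    \min_{X \in \mathcal{M}_r} \; \tfrac{1}{2} \| P_\Omega(X - A) \|_F^2,
    \qquad
    \mathcal{M}_r = \{ X \in \mathbb{R}^{m \times n} : \operatorname{rank}(X) = r \},

with P_\Omega(X)_{ij} = X_{ij} for (i, j) \in \Omega and 0 otherwise; the Riemannian conjugate gradient method then operates on the smooth manifold \mathcal{M}_r instead of handling the rank constraint explicitly.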


Fixed-rank matrix factorizations and Riemannian low-rank optimization

Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and...
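As a brief illustration of the quotient viewpoint: a rank-r factorization X = GH^\top with G \in \mathbb{R}^{m \times r} and H \in \mathbb{R}^{n \times r} is not unique, since for any invertible M \in \mathrm{GL}(r)

    G H^\top = (G M^{-1}) (H M^\top)^\top,

so the natural search space is the quotient of factorization pairs (G, H) by this group action, which is the kind of geometry such papers work out in detail.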


Efficient tensor completion: Low-rank tensor train

This paper proposes a novel formulation of the tensor completion problem to impute missing entries of data represented by tensors. The formulation is introduced in terms of tensor train (TT) rank, which can effectively capture global information of tensors thanks to its construction by a well-balanced matricization scheme. Two algorithms are proposed to solve the corresponding tensor completion p...
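For orientation, a tensor in TT format is stored as a chain of three-way cores, and any entry is a product of matrix slices. A minimal NumPy sketch (illustrative names, not this paper's code):

    import numpy as np

    def tt_entry(cores, idx):
        # Entry of a TT tensor: X[i1, ..., id] = G1[:, i1, :] @ ... @ Gd[:, id, :],
        # where core Gk has shape (r_{k-1}, n_k, r_k) and r_0 = r_d = 1.
        v = np.ones((1, 1))
        for G, i in zip(cores, idx):
            v = v @ G[:, i, :]
        return v[0, 0]

    # Hypothetical 4 x 5 x 6 tensor with TT ranks (1, 2, 3, 1).
    cores = [np.random.rand(1, 4, 2), np.random.rand(2, 5, 3), np.random.rand(3, 6, 1)]
    x = tt_entry(cores, (0, 1, 2))

The TT rank is the tuple of internal core dimensions (r_1, ..., r_{d-1}), which is the quantity the completion formulation constrains.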


Guarantees of Riemannian Optimization for Low Rank Matrix Completion

We study Riemannian optimization methods on the embedded manifold of low-rank matrices for the problem of matrix completion, which aims to recover a low-rank matrix from a subset of its entries. Assume m entries of an n × n rank-r matrix are sampled independently and uniformly with replacement. We first prove that with high probability the Riemannian gradient descent and conjugate gradient d...
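Since the abstract is truncated, here is a sketch of the standard ingredients of such embedded-manifold methods (assumed, not quoted from the paper): at a rank-r point X = U \Sigma V^\top, the tangent-space projector is

    P_{T_X}(Z) = U U^\top Z + Z V V^\top - U U^\top Z V V^\top,

and one Riemannian gradient descent iteration moves along the projected negative gradient and retracts back to rank r, e.g. by a truncated SVD: X_{k+1} = \mathrm{SVD}_r\big(X_k - \alpha_k P_{T_{X_k}}(\nabla f(X_k))\big).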



Journal

Journal title: SIAM Journal on Scientific Computing

Year: 2022

ISSN: 1095-7197, 1064-8275

DOI: https://doi.org/10.1137/20m1356774